Nearest neighbor classifier generalization through spatially constrained filters
Authors
Abstract
It is widely understood that the performance of the nearest neighbor (NN) rule is dependent on: (i) the way distances are computed between different examples, and (ii) the type of feature representation used. Linear filters are often used in computer vision as a pre-processing step, to extract useful feature representations. In this paper we demonstrate an equivalence between (i) and (ii) for NN tasks involving weighted Euclidean distances. Specifically, we demonstrate how the application of a bank of linear filters can be re-interpreted, in the form of a symmetric weighting matrix, as a manipulation of how distances are computed between different examples for NN classification. Further, we argue that filters fulfill the role of encoding local spatial constraints into this weighting matrix. We then demonstrate how these constraints can dramatically increase the generalization capability of canonical distance metric learning techniques in the presence of unseen illumination and viewpoint change.
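As a rough illustration of the stated equivalence (not the authors' code; the filter bank F, patch size, and data below are arbitrary), the following NumPy sketch checks that the Euclidean distance between filter-bank responses of two patches equals a weighted Euclidean distance in the original pixel space, with the symmetric weighting matrix W = F^T F induced by the filters.

```python
import numpy as np

rng = np.random.default_rng(0)

d, m = 64, 16                          # pixels per vectorised patch, number of filters
F = rng.standard_normal((m, d))        # hypothetical filter bank: one linear filter per row
x = rng.standard_normal(d)             # two example (vectorised) image patches
y = rng.standard_normal(d)

# (i) Euclidean distance computed on the filtered feature representation
dist_filtered = np.sum((F @ x - F @ y) ** 2)

# (ii) weighted Euclidean distance in the original space, with W = F^T F
W = F.T @ F                            # symmetric, positive semi-definite weighting matrix
diff = x - y
dist_weighted = diff @ W @ diff

print(np.allclose(dist_filtered, dist_weighted))   # True: (i) and (ii) coincide
```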
Similar resources
Improving k Nearest Neighbor with Exemplar Generalization for Imbalanced Classification
A k nearest neighbor (kNN) classifier classifies a query instance to the most frequent class of its k nearest neighbors in the training instance space. For imbalanced class distribution, a query instance is often overwhelmed by majority class instances in its neighborhood and likely to be classified to the majority class. We propose to identify exemplar minority class training instances and gen...
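For concreteness, here is a minimal, self-contained sketch of the plain kNN rule described above (not the proposed exemplar-generalization method; the toy data and class sizes are made up). It shows how a query near the minority class can be outvoted by surrounding majority-class instances as k grows.

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, query, k=5):
    dists = np.linalg.norm(X_train - query, axis=1)   # Euclidean distance to every training point
    nearest = np.argsort(dists)[:k]                    # indices of the k nearest neighbors
    return Counter(y_train[nearest]).most_common(1)[0][0]  # majority vote

rng = np.random.default_rng(1)
# Imbalanced toy data: 95 majority-class points vs. 5 minority-class points
X_maj = rng.normal(0.0, 1.0, size=(95, 2))
X_min = rng.normal(1.5, 0.3, size=(5, 2))
X = np.vstack([X_maj, X_min])
y = np.array([0] * 95 + [1] * 5)

# Even a query lying inside the minority cluster can be outvoted once k
# exceeds the number of nearby minority instances.
print(knn_predict(X, y, np.array([1.5, 1.5]), k=3))
print(knn_predict(X, y, np.array([1.5, 1.5]), k=25))
```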
The performance of a new hybrid classifier based on boxes and nearest neighbors
In this paper we present a new type of binary classifier defined on the unit cube. This classifier combines some of the aspects of the standard methods that have been used in the logical analysis of data (LAD) and geometric classifiers, with a nearest-neighbor paradigm. We assess the predictive performance of the new classifier in learning from a sample, obtaining generalization error bounds th...
Predicting noise filtering efficacy with data complexity measures for nearest neighbor classification
Classifier performance, particularly of instance-based learners such as k-nearest neighbors, is affected by the presence of noisy data. Noise filters are traditionally employed to remove these corrupted data and improve the classification performance. However, their efficacy depends on the properties of the data, which can be analyzed by what are known as data complexity measures. This paper st...
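As one concrete instance of such a noise filter (a classical choice, not necessarily the one evaluated in that paper), Wilson's Edited Nearest Neighbour discards training instances whose label disagrees with the majority label of their own k nearest neighbours before the kNN classifier is applied. A minimal NumPy sketch, with illustrative names, follows.

```python
import numpy as np
from collections import Counter

def edited_nearest_neighbour(X, y, k=3):
    """Return the subset of (X, y) whose labels agree with their k nearest neighbours."""
    keep = []
    for i in range(len(X)):
        dists = np.linalg.norm(X - X[i], axis=1)
        dists[i] = np.inf                            # exclude the instance itself
        neighbours = np.argsort(dists)[:k]
        majority = Counter(y[neighbours]).most_common(1)[0][0]
        if majority == y[i]:                         # keep only label-consistent instances
            keep.append(i)
    return X[keep], y[keep]
```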
Comparing pixel-based and object-based algorithms for classifying land use of arid basins (Case study: Mokhtaran Basin, Iran)
In this research, pixel-based and object-based image analysis techniques were investigated and compared for producing a land use map of the arid Mokhtaran Basin, Birjand. Using Landsat satellite imagery from 2015, land use classification was performed with three object-based algorithms: supervised fuzzy maximum likelihood, maximum likelihood, and K-nearest neighbor. Nine combinations...
rknn: an R Package for Parallel Random KNN Classification with Variable Selection
Random KNN (RKNN) is a novel generalization of traditional nearest-neighbor modeling. Random KNN consists of an ensemble of base k-nearest neighbor models, each constructed from a random subset of the input variables. A collection of r such base classifiers is combined to build the final Random KNN classifier. Since the base classifiers can be computed independently of one another, the overall ...
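A minimal Python sketch of this ensemble idea (the actual rknn package is written in R; all names and default values below are illustrative) trains r base kNN classifiers, each on a random subset of the input variables, and combines their predictions by majority vote.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def random_knn_predict(X_train, y_train, X_test, r=25, n_features=5, k=3, seed=0):
    """Random-KNN-style prediction: majority vote over r kNN models on random feature subsets.

    Assumes class labels are non-negative integers (required by np.bincount below)."""
    rng = np.random.default_rng(seed)
    d = X_train.shape[1]
    votes = []
    for _ in range(r):
        subset = rng.choice(d, size=min(n_features, d), replace=False)  # random variable subset
        base = KNeighborsClassifier(n_neighbors=k).fit(X_train[:, subset], y_train)
        votes.append(base.predict(X_test[:, subset]))
    votes = np.stack(votes)                           # shape (r, n_test)
    # majority vote across the r base classifiers for each test point
    return np.array([np.bincount(col).argmax() for col in votes.T])
```

Because each base classifier depends only on its own feature subset, the loop above could be parallelised trivially, which is the point emphasised in the abstract.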
Journal: Pattern Recognition
Volume 46, Issue -
Pages -
Publication date: 2013